Python Job: Senior Data Engineer (online marketplace development)

Company

Cobbleweb LTD
France

Location

Remote position (work from anywhere; no office location)

Job type

Full-Time

Python Job Details

This position is 100% remote

What are we looking for?

At CobbleWeb, we do not simply churn out cookie-cutter products. Our clients rely on us to turn their online marketplace ideas into sustainable businesses. That’s why we offer them a custom user-focused approach that increases their opportunities for success substantially.

Based on the Lean Startup methodology (used by Google, Airbnb, Uber, and Amazon), we follow a Build–Measure–Learn process to help our clients establish product-market fit, gain competitive advantages, and grow their businesses.

The golden thread linking each step in our process is data. Without it, we cannot help our clients make informed decisions about their target audience, marketing channels, product features and much more.

That’s where you come in. We are looking for an experienced Data Engineer who will help us create and manage appropriate metrics models for our clients’ marketplace projects. That includes collaborating with our Business Analyst to identify the right metrics for each project and then collecting, managing, and converting raw data into useful information.

Our ideal candidate understands that the metrics models we build during the Discovery phase of each project go beyond determining what users are doing; they aim to uncover the fundamental reasons behind that behaviour. Your mission is to help our clients understand their business in a way that constantly evolves their thinking and their products towards their ultimate vision.

Your metrics models will support our growth-hacking efforts by finding the best ways to acquire, activate, retain, and convert our clients’ user bases. We use the Pirate Metrics (AARRR) model to measure and analyse our clients’ websites and mobile apps, so we can adjust whatever is necessary to improve performance. You are comfortable building and managing data pipelines for technical metrics (tracking whether the product works as expected and quickly identifying technical problems), as well as UX/UI metrics that help us increase audience engagement.
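For illustration, the Pirate Metrics funnel described above can be sketched as a simple aggregation over user event logs. This is a minimal sketch: the stage names, event records, and data shape are hypothetical examples, not CobbleWeb’s actual schema or tooling.

```python
from collections import defaultdict

# AARRR (Pirate Metrics) funnel stages, in order. The stage names and
# the event records below are hypothetical examples, not a real schema.
STAGES = ["acquisition", "activation", "retention", "revenue", "referral"]

def funnel_metrics(events):
    """Count distinct users per stage and the conversion rate between stages.

    `events` is an iterable of (user_id, stage) pairs.
    """
    users_per_stage = defaultdict(set)
    for user_id, stage in events:
        users_per_stage[stage].add(user_id)

    report = {}
    prev_count = None
    for stage in STAGES:
        count = len(users_per_stage[stage])
        # Conversion is relative to the previous stage; None for the first
        # stage and when the previous stage had no users.
        rate = count / prev_count if prev_count else None
        report[stage] = {"users": count, "conversion": rate}
        prev_count = count
    return report

events = [
    ("u1", "acquisition"), ("u2", "acquisition"), ("u3", "acquisition"),
    ("u1", "activation"), ("u2", "activation"),
    ("u1", "retention"),
]
report = funnel_metrics(events)
print(report["activation"])  # 2 of the 3 acquired users activated
```

In practice these counts would come from a data pipeline over event streams or warehouse tables rather than an in-memory list, but the stage-by-stage conversion logic stays the same.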

Current projects that you can expect to work on include Nestify, a fast-growing property management platform. We have been asked to implement performance tracking for their employees (via admin and employee dashboards) and identify new business opportunities (cities to focus on, optimal pricing, etc.).

You will also help us build CobbleWeb’s internal communication system and knowledge base known as Umy. This set of internal tools will support our globally distributed company structure.

What you will be doing

  • Design, deliver and continuously test data pipelines that will aggregate data into reports.
  • Collaborate with the team to create innovative proofs-of-concept, pilot projects, minimum viable products, and business cases.
  • Transform data into valuable insights that inform business decisions, making use of our internal data platforms and applying appropriate analytical techniques.
  • Help us to understand our users and serve them better through data, conversations, and active research to hear from them directly.
  • Engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using data platform infrastructure effectively.
  • Produce and automate the delivery of key metrics and KPIs to the business. In some cases this will mean simply making data available; in others it will mean developing full reports for end users.
  • Monitor usage of data platforms and work with clients to deprecate reports and data sets that are not needed and create a continuous improvement model for the data.
  • Work with clients to understand data issues, tracing back data lineage and helping the business put appropriate data cleansing and quality processes in place.
  • Work with stakeholders to define and establish data quality rules, definitions and strategies in line with business strategies and goals.
  • Monitor and set standards for data quality.
  • Prioritise data issues.
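The data-quality rules mentioned above can take the form of simple, named predicates applied to each record. The sketch below illustrates the idea; the field names and rules are hypothetical examples, not an actual client schema.

```python
# Minimal sketch of rule-based data-quality checks. Field names and
# rules are hypothetical examples, not an actual client schema.

def not_null(field):
    """Rule: the field must be present and non-null."""
    return lambda row: row.get(field) is not None

def in_range(field, lo, hi):
    """Rule: the field must be a value between lo and hi inclusive."""
    return lambda row: row.get(field) is not None and lo <= row[field] <= hi

RULES = {
    "listing_id is present": not_null("listing_id"),
    "nightly_price between 1 and 10000": in_range("nightly_price", 1, 10_000),
}

def validate(rows):
    """Return a list of (row_index, failed_rule_name) pairs."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in RULES.items():
            if not rule(row):
                failures.append((i, name))
    return failures

rows = [
    {"listing_id": "L1", "nightly_price": 120},
    {"listing_id": None, "nightly_price": -5},
]
print(validate(rows))  # the second row fails both rules
```

Keeping rules named and declarative like this makes it straightforward to report failures back to stakeholders, monitor rule pass rates over time, and prioritise the issues that fail most often.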

Job requirements

  • BSc or MSc in Computer Science or a related technical field. Equivalent work experience will also be considered.
  • At least 4 years of experience in cloud data engineering roles with a solid understanding of cloud storage and cloud technologies.
  • A strong coding background in either Python or Scala.
  • Experience with cloud-based SQL and NoSQL databases.
  • Excellent SQL skills enabling large-scale data transformation and analysis.
  • Experience developing relational databases based on SQL and data warehousing technologies.
  • Knowledge of the pros and cons of various database technologies, such as relational, NoSQL, MPP, and columnar databases.
  • A comprehensive understanding of cloud data warehousing and data transformation (extract, transform, load) processes and supporting technologies such as Google Dataflow, Looker, AWS Glue, EMR, Azure Data Factory, Data Lake, and other analytics tools.
  • Expert knowledge of Elasticsearch.
  • Experience manipulating data through cleansing, parsing, standardising, etc., especially in relation to improving data quality and integrity.
  • Proven ability to design data models and ETL pipelines that meet business requirements in the most efficient manner.
  • You have designed and deployed data pipelines and ETL systems for data at scale, focusing on outcomes and continuous learning.
  • Good data modelling experience addressing scale and read/write performance.
  • Previous experience meeting the visualisation, reporting, and analytics needs of key business functions through the development of presentation and data models.
  • Experience defining and developing data sets, models, and cubes.
  • Knowledge of emerging technologies that support business intelligence, analytics, and data.
  • A curious, level-headed approach to problem solving, with a fine eye for detail and the ability to look at the wider business context to spot opportunities for improvement.
  • Passionate about data and unlocking data for the masses.

Recommended

  • Previous experience working with software development companies.
  • An understanding of the platform economy, especially online marketplaces.

THIS JOB IS NOT AVAILABLE FOR AGENCIES.

Job Types: Full-time, Contract, Permanent
Contract length: 12 months

Application Question(s):

  • What's your salary expectation per year?
  • What's your country of residence?

Work Location: Remote